Doob's martingale convergence theorems
In mathematics, specifically in the theory of stochastic processes, Doob's martingale convergence theorems are a collection of results on the limits of supermartingales, named after the American mathematician Joseph L. Doob. Informally, the martingale convergence theorem typically refers to the result that any supermartingale satisfying a certain boundedness condition must converge. One may think of supermartingales as the random variable analogues of non-increasing sequences; from this perspective, the martingale convergence theorem is a random variable analogue of the monotone convergence theorem, which states that any bounded monotone sequence converges. There are symmetric results for submartingales, which are analogous to non-decreasing sequences.


Statement for discrete-time martingales

A common formulation of the martingale convergence theorem for discrete-time martingales is the following. Let X_1, X_2, X_3, \dots be a supermartingale. Suppose that the supermartingale is bounded in the sense that

:\sup_t \operatorname{E}[X_t^-] < \infty

where X_t^- is the negative part of X_t, defined by X_t^- = -\min(X_t, 0). Then the sequence converges almost surely to a random variable X with finite expectation. There is a symmetric statement for submartingales with bounded expectation of the positive part. A supermartingale is a stochastic analogue of a non-increasing sequence, and the condition of the theorem is analogous to the condition in the monotone convergence theorem that the sequence be bounded from below. The condition that the martingale is bounded is essential; for example, an unbiased \pm 1 random walk is a martingale but does not converge.

As intuition, there are two reasons why a sequence may fail to converge: it may go off to infinity, or it may oscillate. The boundedness condition prevents the former from happening. The latter is impossible by a "gambling" argument. Specifically, consider a stock market game in which at time t the stock has price X_t. There is no strategy for buying and selling the stock over time, always holding a non-negative amount of stock, which has positive expected profit in this game. The reason is that at each time the expected change in stock price, given all past information, is at most zero (by definition of a supermartingale). But if the prices were to oscillate without converging, then there would be a strategy with positive expected profit: loosely, buy low and sell high. This argument can be made rigorous to prove the result.
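The theorem can be illustrated numerically. The sketch below (an illustrative simulation, not part of the classical presentation; the function name `polya_urn_path` is invented here) tracks the fraction of red balls in a Pólya urn. That fraction is a martingale confined to [0,1], so its negative part is identically zero, the boundedness condition holds trivially, and the theorem guarantees almost-sure convergence of each path.

```python
import random

def polya_urn_path(steps, seed):
    """Simulate the red-ball fraction of a Polya urn.

    Start with 1 red and 1 black ball; at each step draw a ball
    uniformly at random and return it with one extra ball of the
    same colour.  The red fraction X_n is a martingale with
    0 <= X_n <= 1, so Doob's theorem gives a.s. convergence.
    """
    rng = random.Random(seed)
    red, total = 1, 2
    path = [red / total]
    for _ in range(steps):
        if rng.random() < red / total:
            red += 1
        total += 1
        path.append(red / total)
    return path

# Late fluctuations of each simulated path are tiny: the paths settle.
max_late_move = 0.0
for seed in range(200):
    p = polya_urn_path(5000, seed)
    tail = p[4000:]
    max_late_move = max(max_late_move, max(tail) - min(tail))
print(max_late_move)  # small: every simulated path has nearly converged
```

The limit varies from path to path (for this urn it is uniformly distributed), which matches the theorem: convergence is to a random variable, not to a constant.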


Proof sketch

The proof is simplified by making the (stronger) assumption that the supermartingale is uniformly bounded; that is, there is a constant M such that |X_n| \leq M always holds. If the sequence X_1, X_2, \dots does not converge, then \liminf X_n and \limsup X_n differ. Since the sequence is bounded, there are then some real numbers a and b such that a < b and the sequence crosses the interval [a,b] infinitely often. That is, the sequence is eventually less than a, at a later time exceeds b, at an even later time is less than a, and so forth ad infinitum. These periods where the sequence starts below a and later exceeds b are called "upcrossings".

Consider a stock market game in which at time t one may buy or sell shares of the stock at price X_t. On the one hand, it can be shown from the definition of a supermartingale that for any N \in \mathbf{N} there is no strategy which maintains a non-negative amount of stock and has positive expected profit after playing this game for N steps. On the other hand, if the prices cross a fixed interval [a,b] very often, then the following strategy seems to do well: buy the stock when the price drops below a, and sell it when the price exceeds b. Indeed, if u_N is the number of upcrossings in the sequence by time N, then the profit at time N is at least (b-a)u_N - 2M: each upcrossing provides at least b-a profit, and if the last action was a "buy", then in the worst case the buying price was a \leq M and the current price is -M. But any strategy has expected profit at most 0, so necessarily

:\operatorname{E}[u_N] \leq \frac{2M}{b-a}.

By the monotone convergence theorem for expectations, this means that

:\operatorname{E}\Big[\lim_{N \to \infty} u_N\Big] \leq \frac{2M}{b-a},

so the expected number of upcrossings in the whole sequence is finite. It follows that the infinite-crossing event for the interval [a,b] occurs with probability 0. By a union bound over all rational a and b, with probability 1 no interval exists which is crossed infinitely often. If for all a, b \in \mathbf{Q} there are finitely many upcrossings of the interval [a,b], then the limit inferior and limit superior of the sequence must agree, so the sequence must converge. This shows that the martingale converges with probability 1.
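The gambling argument can be checked by simulation. The following sketch (illustrative only; `walk` and `strategy_profit` are names invented here) implements the buy-below-a, sell-above-b strategy against a fair \pm 1 random walk, which is a martingale. The per-path profit can be large when there are many upcrossings, but the average profit over many paths is statistically indistinguishable from zero, as the argument requires.

```python
import random

def walk(steps, seed):
    """Fair ±1 walk used as the stock price X_t (a martingale)."""
    rng = random.Random(seed)
    x, path = 0, [0]
    for _ in range(steps):
        x += 1 if rng.random() < 0.5 else -1
        path.append(x)
    return path

def strategy_profit(path, a, b):
    """Buy one share when the price is below a, sell when above b."""
    holding, profit, buy_price = False, 0, 0
    for x in path:
        if not holding and x < a:
            holding, buy_price = True, x
        elif holding and x > b:
            holding, profit = False, profit + (x - buy_price)
    if holding:                 # forced liquidation at the final price
        profit += path[-1] - buy_price
    return profit

trials = 5000
mean_profit = sum(strategy_profit(walk(200, s), -3, 3) for s in range(trials)) / trials
# Against a (super)martingale no such strategy has positive expected
# profit, so mean_profit hovers around 0 up to sampling noise.
print(mean_profit)
```

Each completed upcrossing of [-3, 3] earns at least b - a = 6, yet the forced liquidation losses exactly cancel these gains in expectation, which is the heart of the proof.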


Failure of convergence in mean

Under the conditions of the martingale convergence theorem given above, it is not necessarily true that the supermartingale (X_n)_n converges in mean, i.e. that \lim_{n\to\infty} \operatorname{E}[\,|X_n - X|\,] = 0.

As an example, let (X_n)_n be a \pm 1 random walk with X_0 = 1. Let N be the first time when X_n = 0, and let (Y_n)_n be the stochastic process defined by Y_n := X_{\min(N,n)}. Then N is a stopping time with respect to the martingale (X_n)_n, so (Y_n)_n is also a martingale, referred to as a stopped martingale. In particular, (Y_n)_n is a supermartingale which is bounded below, so by the martingale convergence theorem it converges pointwise almost surely to a random variable Y. But if Y_n > 0 then Y_{n+1} = Y_n \pm 1, so Y is almost surely zero. This means that \operatorname{E}[Y] = 0. However, \operatorname{E}[Y_n] = 1 for every n \geq 1, since (Y_n)_n is a random walk which starts at 1 and subsequently makes mean-zero moves (alternately, note that \operatorname{E}[Y_n] = \operatorname{E}[Y_0] = 1 since (Y_n)_n is a martingale). Therefore (Y_n)_n cannot converge to Y in mean. Moreover, if (Y_n)_n were to converge in mean to any random variable R, then some subsequence converges to R almost surely. So by the above argument R = 0 almost surely, which contradicts convergence in mean.
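This counterexample is easy to simulate. The sketch below (illustrative; the helper name `stopped_walk` is invented here) runs the stopped walk Y_n = X_{\min(N,n)}: most paths are absorbed at 0, yet the empirical mean stays near 1 because a few rare surviving paths grow large, which is exactly the failure of convergence in mean.

```python
import random

def stopped_walk(steps, seed):
    """Return Y_n for a fair ±1 walk started at 1 and stopped at 0."""
    rng = random.Random(seed)
    y = 1
    for _ in range(steps):
        if y == 0:          # absorbed: the stopped walk stays at 0
            break
        y += 1 if rng.random() < 0.5 else -1
    return y

n, paths = 400, 20000
finals = [stopped_walk(n, s) for s in range(paths)]
mean_Y = sum(finals) / paths                      # stays near E[Y_n] = 1
frac_zero = sum(f == 0 for f in finals) / paths   # most mass sits at 0
print(mean_Y, frac_zero)
```

Pointwise, almost every path ends at 0 (so Y = 0 almost surely), but the expectation is carried by the rare large values, so E[Y_n] = 1 for all n and L^1 convergence fails.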



Statements for the general case

In the following, (\Omega, F, F_*, \mathbf{P}) will be a filtered probability space where F_* = (F_t)_{t \geq 0}, and N : [0,\infty) \times \Omega \to \mathbf{R} will be a right-continuous supermartingale with respect to the filtration F_*; in other words, for all 0 \leq s \leq t < +\infty,

:N_s \geq \operatorname{E}\big[ N_t \mid F_s \big].


Doob's first martingale convergence theorem

Doob's first martingale convergence theorem provides a sufficient condition for the random variables N_t to have a limit as t\to+\infty in a pointwise sense, i.e. for each \omega in the sample space \Omega individually.

For t\geq 0, let N_t^- = \max(-N_t, 0) and suppose that

:\sup_{t > 0} \operatorname{E}\big[ N_t^- \big] < + \infty.

Then the pointwise limit

:N(\omega) = \lim_{t \to +\infty} N_t(\omega)

exists and is finite for \mathbf{P}-almost all \omega \in \Omega.


Doob's second martingale convergence theorem

It is important to note that the convergence in Doob's first martingale convergence theorem is pointwise, not uniform, and is unrelated to convergence in mean square, or indeed in any ''Lp'' space. In order to obtain convergence in ''L''1 (i.e., convergence in mean), one requires uniform integrability of the random variables N_t. By Chebyshev's inequality, convergence in ''L''1 implies convergence in probability and convergence in distribution.

The following are equivalent:
* (N_t)_{t > 0} is uniformly integrable, i.e.
::\lim_{C \to \infty} \sup_{t > 0} \int_{\{ \omega : | N_t(\omega) | \geq C \}} \left| N_t(\omega) \right| \, \mathrm{d}\mathbf{P}(\omega) = 0;
* there exists an integrable random variable N \in L^1(\Omega,\mathbf{P};\mathbf{R}) such that N_t \to N as t\to\infty both \mathbf{P}-almost surely and in L^1(\Omega,\mathbf{P};\mathbf{R}), i.e.
::\operatorname{E}\left[ \left| N_t - N \right| \right] = \int_\Omega \left| N_t(\omega) - N(\omega) \right| \, \mathrm{d}\mathbf{P}(\omega) \to 0 \text{ as } t \to + \infty.


Doob's upcrossing inequality

The following result, called Doob's upcrossing inequality or, sometimes, Doob's upcrossing lemma, is used in proving Doob's martingale convergence theorems. A "gambling" argument shows that for uniformly bounded supermartingales, the number of upcrossings is bounded; the upcrossing lemma generalizes this argument to supermartingales with bounded expectation of their negative parts.

Let N be a natural number. Let (X_n)_{n \in \mathbf{N}} be a supermartingale with respect to a filtration (\mathcal{F}_n)_{n \in \mathbf{N}}. Let a, b be two real numbers with a < b. Define the random variables (U_n)_{n \in \mathbf{N}} so that U_n is the maximum number of disjoint intervals [n_{2i-1}, n_{2i}] with n_{2i} \leq n, such that X_{n_{2i-1}} < a < b < X_{n_{2i}}. These are called upcrossings with respect to the interval [a,b]. Then

:(b - a) \operatorname{E}[U_n] \le \operatorname{E}[(X_n - a)^-]

where X^- is the negative part of X, defined by X^- = -\min(X, 0).
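The two sides of the inequality can be estimated empirically. The sketch below (illustrative; `drift_walk` and `count_upcrossings` are names invented here) uses a \pm 1 walk with downward drift, which is a supermartingale, counts its upcrossings of [a, b] greedily (the greedy count equals the maximum number of disjoint upcrossings), and compares (b - a)\operatorname{E}[U_n] with \operatorname{E}[(X_n - a)^-].

```python
import random

def drift_walk(steps, seed):
    """±1 walk with P(+1) = 0.45: a supermartingale (downward drift)."""
    rng = random.Random(seed)
    x, path = 0, [0]
    for _ in range(steps):
        x += 1 if rng.random() < 0.45 else -1
        path.append(x)
    return path

def count_upcrossings(path, a, b):
    """Greedy scan = maximum number of disjoint upcrossings of [a, b]."""
    count, below = 0, False
    for x in path:
        if x < a:
            below = True
        elif x > b and below:
            count += 1
            below = False
    return count

a, b, n, trials = -2, 2, 200, 2000
ups, neg_part = 0.0, 0.0
for s in range(trials):
    p = drift_walk(n, s)
    ups += count_upcrossings(p, a, b)
    neg_part += max(a - p[-1], 0)        # (X_n - a)^-
lhs = (b - a) * ups / trials
rhs = neg_part / trials
print(lhs, rhs)   # Doob's inequality predicts lhs <= rhs
```

For this strongly drifting walk the right-hand side grows with n while upcrossings remain rare, so the inequality holds with a wide margin; tighter examples would use a martingale, for which the bound is sharper.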


Applications


Convergence in ''L''''p''

Let M : [0,\infty) \times \Omega \to \mathbf{R} be a sample continuous martingale such that

:\sup_{t > 0} \operatorname{E}\big[ | M_t |^p \big] < + \infty

for some p > 1. Then there exists a random variable M \in L^p(\Omega,\mathbf{P};\mathbf{R}) such that M_t \to M as t\to +\infty both \mathbf{P}-almost surely and in L^p(\Omega,\mathbf{P};\mathbf{R}).

The statement for discrete-time martingales is essentially identical, with the obvious difference that the continuity assumption is no longer necessary.
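The discrete-time case with p = 2 can be illustrated with the martingale M_n = \sum_{i \le n} \xi_i / i, where the \xi_i are independent fair signs: since \operatorname{E}[M_n^2] = \sum_{i \le n} 1/i^2 < \pi^2/6, the process is bounded in L^2 and the theorem gives convergence almost surely and in L^2. The sketch below (illustrative; `martingale_path` is a name invented here) checks both the uniform second-moment bound and the smallness of late increments in L^2.

```python
import random

def martingale_path(n, seed):
    """Path of M_k = sum_{i<=k} xi_i / i with independent fair ±1 signs."""
    rng = random.Random(seed)
    m, path = 0.0, [0.0]
    for i in range(1, n + 1):
        m += (1 if rng.random() < 0.5 else -1) / i
        path.append(m)
    return path

paths = [martingale_path(2000, s) for s in range(500)]
# E[M_n^2] = sum 1/i^2 stays below pi^2/6 ≈ 1.645 for every n.
second_moment = sum(p[-1] ** 2 for p in paths) / 500
# E[(M_2000 - M_1000)^2] = sum_{1001<=i<=2000} 1/i^2 ≈ 5e-4: tiny.
tail_l2 = sum((p[2000] - p[1000]) ** 2 for p in paths) / 500
print(second_moment, tail_l2)
```

The vanishing L^2 size of the tail increments is the Cauchy property underlying convergence in L^2; the pointwise limit here is the random series \sum_i \xi_i / i.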


Lévy's zero–one law

Doob's martingale convergence theorems imply that conditional expectations also have a convergence property.

Let (\Omega, F, \mathbf{P}) be a probability space and let X be a random variable in L^1. Let F_* = (F_k)_{k \in \mathbf{N}} be any filtration of F, and define F_\infty to be the minimal ''σ''-algebra generated by (F_k)_{k \in \mathbf{N}}. Then

:\operatorname{E}\big[ X \mid F_k \big] \to \operatorname{E}\big[ X \mid F_\infty \big] \text{ as } k \to \infty

both \mathbf{P}-almost surely and in L^1.

This result is usually called Lévy's zero–one law or Lévy's upwards theorem. The reason for the name is that if A is an event in F_\infty, then the theorem says that \mathbf{P}[A \mid F_k] \to \mathbf{1}_A almost surely, i.e., the limit of the probabilities is 0 or 1. In plain language, if we are learning gradually all the information that determines the outcome of an event, then we will become gradually certain what the outcome will be. This sounds almost like a tautology, but the result is still non-trivial. For instance, it easily implies Kolmogorov's zero–one law, since it says that for any tail event ''A'', we must have \mathbf{1}_A = \mathbf{P}[A] almost surely, hence \mathbf{P}[A] \in \{0, 1\}.

Similarly we have Lévy's downwards theorem: Let (\Omega, F, \mathbf{P}) be a probability space and let X be a random variable in L^1. Let (F_k)_{k \in \mathbf{N}} be any decreasing sequence of sub-sigma-algebras of F, and define F_\infty to be the intersection. Then

:\operatorname{E}\big[ X \mid F_k \big] \to \operatorname{E}\big[ X \mid F_\infty \big] \text{ as } k \to \infty

both \mathbf{P}-almost surely and in L^1.
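The upwards theorem can be made concrete for the event A = \{S_{100} > 0\}, where S_n is a fair \pm 1 walk and F_k records the first k steps. Here \mathbf{P}[A \mid F_k] depends only on the current partial sum and can be computed exactly from binomial tails; along any realized path it converges to 0 or 1 once all 100 steps are revealed. The sketch below is illustrative; the name `cond_prob_positive` is invented here.

```python
import math
import random

def cond_prob_positive(partial_sum, k, n):
    """Exact P(S_n > 0 | F_k) for a fair ±1 walk, given S_k = partial_sum.

    The remaining m = n - k steps are independent of F_k, and S_n > 0
    iff the number j of +1 steps among them satisfies 2*j - m > -S_k.
    """
    m = n - k
    need = (m - partial_sum) // 2 + 1   # smallest j with 2*j - m > -S_k
    need = max(need, 0)
    total = sum(math.comb(m, j) for j in range(need, m + 1))
    return total / 2 ** m

n = 100
rng = random.Random(7)
steps = [1 if rng.random() < 0.5 else -1 for _ in range(n)]
partial = [0]
for st in steps:
    partial.append(partial[-1] + st)

probs = [cond_prob_positive(partial[k], k, n) for k in range(n + 1)]
# probs[n] is exactly 0 or 1: once all information is revealed, the
# conditional probability equals the indicator of the event.
print(probs[0], probs[n])
```

The sequence \mathbf{P}[A \mid F_k] is itself a bounded martingale in k, so its convergence is also an instance of the convergence theorem above.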


See also

* Backwards martingale convergence theorem


References

* {{cite book | last = Øksendal | first = Bernt K. | authorlink = Bernt Øksendal | title = Stochastic Differential Equations: An Introduction with Applications | edition = Sixth | publisher = Springer | location = Berlin | year = 2003 | isbn = 3-540-04758-1 | url = https://books.google.com/books?id=EQZEAAAAQBAJ }} (See Appendix C)
stopping time In probability theory, in particular in the study of stochastic processes, a stopping time (also Markov time, Markov moment, optional stopping time or optional time ) is a specific type of “random time”: a random variable whose value is inter ...
with respect to the martingale (X_n)_, so (Y_n)_ is also a martingale, referred to as a stopped martingale. In particular, (Y_n)_ is a supermartingale which is bounded below, so by the martingale convergence theorem it converges pointwise almost surely to a random variable Y. But if Y_n > 0 then Y_ = Y_n \pm 1, so Y is almost surely zero. This means that \operatorname = 0 . However, \operatorname _n= 1 for every n \geq 1, since (Y_n)_ is a random walk which starts at 1 and subsequently makes mean-zero moves (alternately, note that \operatorname _n= \operatorname _0= 1 since (Y_n)_ is a martingale). Therefore (Y_n)_ cannot converge to Y in mean. Moreover, if (Y_n)_ were to converge in mean to any random variable R , then some subsequence converges to R almost surely. So by the above argument R = 0 almost surely, which contradicts convergence in mean.


Statements for the general case

In the following, (\Omega, F, F_*, \mathbf) will be a
filtered probability space Filtration is a physical separation process that separates solid matter and fluid from a mixture using a ''filter medium'' that has a complex structure through which only the fluid can pass. Solid particles that cannot pass through the filter ...
where F_* = (F_t)_ , and N:
continuous Continuity or continuous may refer to: Mathematics * Continuity (mathematics), the opposing concept to discreteness; common examples include ** Continuous probability distribution or random variable in probability and statistics ** Continuous ...
supermartingale with respect to the filtration F_* ; in other words, for all 0 \leq s \leq t < +\infty , :N_s \geq \operatorname \big[ N_t \mid F_s \big].


Doob's first martingale convergence theorem

Doob's first martingale convergence theorem provides a sufficient condition for the random variables N_t to have a limit as t\to+\infty in a pointwise sense, i.e. for each \omega in the
sample space In probability theory, the sample space (also called sample description space, possibility space, or outcome space) of an experiment or random trial is the set of all possible outcomes or results of that experiment. A sample space is usually den ...
\Omega individually. For t\geq 0, let N_t^- = \max(-N_t,0) and suppose that :\sup_ \operatorname \big N_t^ \big< + \infty. Then the pointwise limit :N(\omega) = \lim_ N_t (\omega) exists and is finite for \mathbf- almost all \omega \in \Omega.


Doob's second martingale convergence theorem

It is important to note that the convergence in Doob's first martingale convergence theorem is pointwise, not uniform, and is unrelated to convergence in mean square, or indeed in any ''Lp'' space. In order to obtain convergence in ''L''1 (i.e.,
convergence in mean In probability theory, there exist several different notions of convergence of random variables. The convergence of sequences of random variables to some limit random variable is an important concept in probability theory, and its applications to ...
), one requires uniform integrability of the random variables N_t. By
Chebyshev's inequality In probability theory, Chebyshev's inequality (also called the Bienaymé–Chebyshev inequality) guarantees that, for a wide class of probability distributions, no more than a certain fraction of values can be more than a certain distance from th ...
, convergence in ''L''1 implies convergence in probability and convergence in distribution. The following are equivalent: * (N_t)_ is uniformly integrable, i.e. ::\lim_ \sup_ \int_ \left, N_t (\omega) \ \, \mathrm \mathbf (\omega) = 0; * there exists an
integrable In mathematics, integrability is a property of certain dynamical systems. While there are several distinct formal definitions, informally speaking, an integrable system is a dynamical system with sufficiently many conserved quantities, or first ...
random variable N \in L^1(\Omega,\mathbf;\mathbf) such that N_t \to N as t\to\infty both \mathbf-
almost surely In probability theory, an event is said to happen almost surely (sometimes abbreviated as a.s.) if it happens with probability 1 (or Lebesgue measure 1). In other words, the set of possible exceptions may be non-empty, but it has probability 0 ...
and in L^1(\Omega,\mathbf;\mathbf), i.e. ::\operatorname \left N_t - N \ \right= \int_\Omega \left, N_t (\omega) - N (\omega) \ \, \mathrm \mathbf (\omega) \to 0 \text t \to + \infty.


Doob's upcrossing inequality

The following result, called Doob's upcrossing inequality or, sometimes, Doob's upcrossing lemma, is used in proving Doob's martingale convergence theorems. A "gambling" argument shows that for uniformly bounded supermartingales, the number of upcrossings is bounded; the upcrossing lemma generalizes this argument to supermartingales with bounded expectation of their negative parts. Let N be a natural number. Let (X_n)_ be a supermartingale with respect to a filtration (\mathcal_n)_. Let a, b be two real numbers with a < b. Define the random variables (U_n)_ so that U_n is the maximum number of disjoint intervals _, n_ with n_ \leq n , such that X_ < a < b < X_ . These are called upcrossings with respect to interval ,b. Then : (b - a) \operatorname _n\le \operatorname X_n - a)^-\quad where X^- is the negative part of X , defined by X^- = -\min(X, 0) .


Applications


Convergence in ''L''''p''

Let M : [0,\infty) \times \Omega \to \mathbf{R} be a sample continuous martingale such that

: \sup_{t > 0} \operatorname{E} \big[ | M_t |^p \big] < + \infty

for some p > 1. Then there exists a random variable M \in L^p(\Omega,\mathbf{P};\mathbf{R}) such that M_t \to M as t \to +\infty, both \mathbf{P}-almost surely and in L^p(\Omega,\mathbf{P};\mathbf{R}). The statement for discrete-time martingales is essentially identical, with the obvious difference that the continuity assumption is no longer necessary.
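For intuition, here is an illustrative discrete-time sketch (the martingale and all parameters are chosen for the example): M_n = \sum_{k \le n} \varepsilon_k 2^{-k} with independent signs \varepsilon_k = \pm 1 satisfies \sup_n \operatorname{E}[|M_n|^2] \le \sum_k 4^{-k} < \infty, so the theorem with p = 2 gives convergence almost surely and in L^2. The code estimates \operatorname{E}[|M_N - M_n|^2] for increasing n and watches it shrink (the exact value is (1/3) 4^{-n} up to truncation at N).

```python
import random

# M_n = sum_{k<=n} eps_k / 2^k with eps_k independent +-1 is a martingale
# bounded in L^2, so it converges a.s. and in L^2 by the theorem with p = 2.
random.seed(1)
N, trials = 30, 5000

def l2_gap(n, N, trials):
    """Monte Carlo estimate of E[|M_N - M_n|^2], the tail sum of the series."""
    total = 0.0
    for _ in range(trials):
        tail = sum(random.choice([-1, 1]) / 2**k for k in range(n + 1, N + 1))
        total += tail * tail
    return total / trials

gaps = [l2_gap(n, N, trials) for n in (1, 5, 10, 20)]
print(gaps)  # shrinking toward 0: convergence in L^2
```

Each estimate drops by roughly a factor of 4^{-5} from the previous one, matching the geometric tail of the variance series.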


Lévy's zero–one law

Doob's martingale convergence theorems imply that conditional expectations also have a convergence property. Let (\Omega, F, \mathbf{P}) be a probability space and let X be a random variable in L^1. Let F_* = (F_k)_{k \in \mathbf{N}} be any filtration of F, and define F_\infty to be the minimal ''σ''-algebra generated by (F_k)_{k \in \mathbf{N}}. Then

: \operatorname{E} \big[ X \mid F_k \big] \to \operatorname{E} \big[ X \mid F_\infty \big] \text{ as } k \to \infty

both \mathbf{P}-almost surely and in L^1. This result is usually called Lévy's zero–one law or Lévy's upwards theorem. The reason for the name is that if A is an event in F_\infty, then the theorem says that \mathbf{P} [ A \mid F_k ] \to \mathbf{1}_A almost surely, i.e., the limit of the probabilities is 0 or 1. In plain language, if we are gradually learning all the information that determines the outcome of an event, then we will become gradually certain what the outcome will be. This sounds almost like a tautology, but the result is still non-trivial. For instance, it easily implies Kolmogorov's zero–one law, since it says that for any tail event A, we must have \mathbf{P}[A] = \mathbf{1}_A almost surely, hence \mathbf{P}[A] \in \{ 0, 1 \}.

Similarly we have Lévy's downwards theorem: Let (\Omega, F, \mathbf{P}) be a probability space and let X be a random variable in L^1. Let (F_k)_{k \in \mathbf{N}} be any decreasing sequence of sub-''σ''-algebras of F, and define F_\infty to be their intersection. Then

: \operatorname{E} \big[ X \mid F_k \big] \to \operatorname{E} \big[ X \mid F_\infty \big] \text{ as } k \to \infty

both \mathbf{P}-almost surely and in L^1.
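The upwards theorem can be illustrated on a finite horizon (a simplifying assumption for this sketch, with all names and parameters chosen here): take F_k generated by the first k of N fair coin flips and X the fraction of heads among all N flips, so that F_\infty = F_N. Then \operatorname{E}[X \mid F_k] = (S_k + (N-k)/2)/N, which converges to X as k \to N.

```python
import random

# Lévy's upwards theorem on a finite horizon: F_k = sigma(first k flips),
# X = fraction of heads among N fair flips, so F_infinity = F_N and X is
# F_infinity-measurable.  E[X | F_k] is computable in closed form.
random.seed(2)
N = 1000
flips = [random.choice([0, 1]) for _ in range(N)]
X = sum(flips) / N

def cond_exp(k):
    """E[X | F_k]: heads observed so far plus the expected heads remaining."""
    s_k = sum(flips[:k])
    return (s_k + (N - k) / 2) / N

errors = [abs(cond_exp(k) - X) for k in (0, 100, 500, 900, 1000)]
print(errors)  # shrinks to 0 as k grows: E[X | F_k] -> X = E[X | F_infinity]
```

At k = N the conditional expectation equals X exactly, which is the "becoming certain" of the plain-language description above.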


See also

* Backwards martingale convergence theorem


References

* {{cite book , last = Øksendal , first = Bernt K. , authorlink = Bernt Øksendal , title = Stochastic Differential Equations: An Introduction with Applications , edition = Sixth , publisher = Springer , location = Berlin , year = 2003 , isbn = 3-540-04758-1 , url = https://books.google.com/books?id=EQZEAAAAQBAJ }} (See Appendix C)